Snowflake: Uses Snowflake’s native data sharing - data stays in our account but appears as tables in yours. Zero storage costs, only compute charges.
BigQuery: We replicate data to shared datasets in your GCP project. Data is copied to your region for performance.
Both provide the same curated blockchain data that powers dune.com.
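In either case, the shared data behaves like ordinary tables in your warehouse. A minimal sketch in Snowflake SQL, assuming the share is mounted as a database named `dune` with an `ethereum` schema (the actual database, schema, and column names depend on your setup):

```sql
-- Query a shared table like any local table; only compute is billed.
-- Names below are illustrative, not guaranteed.
SELECT block_time, hash, value
FROM dune.ethereum.transactions
WHERE block_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
LIMIT 100;
```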
Snowflake: Account locator (e.g., ABC12345.US-EAST-1) - found in your Snowflake URL
BigQuery: GCP Project ID and a Principal (your email, service account, or domain)
We never need passwords or sensitive credentials - just sharing identifiers.
Snowflake: Self-service via Snowflake Marketplace for a 30-day trial
BigQuery: Email datashares-sales@dune.com with your GCP Project ID
Enterprise: Visit dune.com/enterprise
  • Snowflake (marketplace access, all regions)
  • BigQuery (custom setup, US Central 1 & EU West 2)
Need Databricks, Redshift, or another platform? Contact us.
Standard: 24 hours for most tables
Enterprise: Hourly updates available
Real-time: Some key tables update every few minutes
Freshness varies by chain and table type.
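You can check freshness yourself by comparing the latest timestamp in a table against the current time. A hedged sketch in Snowflake SQL, assuming an illustrative `dune.ethereum.blocks` table with a `block_time` column:

```sql
-- How far behind is this table? (table and column names are assumptions)
SELECT MAX(block_time) AS latest_block_time,
       TIMESTAMPDIFF('minute', MAX(block_time), CURRENT_TIMESTAMP()) AS minutes_behind
FROM dune.ethereum.blocks;
```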
Yes, identical curated data. Data types are optimized for each platform:
  • BigQuery: Uses BIGNUMERIC for large integers, BYTES for addresses
  • Snowflake: Uses VARBINARY for addresses/hashes, DOUBLE for calculations
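Because addresses and hashes are stored as raw bytes, you typically hex-encode them for display or for matching against `0x…` strings. A sketch of both platforms' built-in conversions, with illustrative table and column names (Dune's actual schemas may differ):

```sql
-- BigQuery: BYTES address to a 0x-prefixed hex string
SELECT CONCAT('0x', TO_HEX(from_address)) AS from_hex
FROM `your-project.dune_ethereum.transactions`
LIMIT 10;

-- Snowflake: VARBINARY address to hex (HEX_ENCODE is uppercase by default)
SELECT '0x' || LOWER(HEX_ENCODE(from_address)) AS from_hex
FROM dune.ethereum.transactions
LIMIT 10;
```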
Yes - that’s the main benefit! Query blockchain data alongside your internal datasets using your warehouse’s native SQL.
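For example, a join between shared on-chain data and an internal table might look like the following Snowflake SQL sketch, where `internal_db.public.users` and its `wallet_address` column are hypothetical stand-ins for your own data:

```sql
-- Join curated blockchain data with an internal users table.
-- All names here are illustrative assumptions.
SELECT u.customer_id,
       SUM(t.value) AS total_sent
FROM dune.ethereum.transactions AS t
JOIN internal_db.public.users AS u
  ON t.from_address = u.wallet_address
GROUP BY u.customer_id;
```

Because the join runs entirely inside your warehouse, your internal data never leaves your environment.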
No. You query in your own warehouse environment. Dune cannot see your queries, results, or any data you join with ours.
Trials: Free (Snowflake 30 days, BigQuery by request)
Production: Contact sales for pricing
Compute: You pay your cloud provider’s standard rates for querying
Click “Get” twice if needed. Snowflake sometimes requires:
  1. First click: Copies data to your region
  2. Second click: Adds share to your account
Shared datasets appear in your BigQuery console under your project. Look for datasets with names like dune_ethereum or dune_polygon.
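If you prefer SQL to the console, you can list the shared datasets via BigQuery's `INFORMATION_SCHEMA`. A sketch assuming your data was replicated to the US region (adjust the region qualifier to match yours):

```sql
-- List datasets in your project whose names start with "dune"
SELECT schema_name
FROM `region-us`.INFORMATION_SCHEMA.SCHEMATA
WHERE schema_name LIKE 'dune%';
```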